Combined Ordered and Improved Trajectories for Large Scale Human Action Recognition

Authors

  • O. V. Ramana Murthy
  • Roland Goecke
Abstract

Recently, a video representation based on dense trajectories has been shown to outperform other human action recognition methods on several benchmark datasets. The trajectories capture the motion characteristics of different objects, such as human bodies, in the spatial and temporal dimensions. In dense trajectories, points are sampled at uniform intervals in space and time and then tracked using a dense optical flow field over a fixed-length time window of L frames (optimally 15), overlapping over the entire video. However, amongst these base trajectories, some continue for longer than the duration L. The longer motion characteristics of objects may be more valuable than the information from the base trajectories, or at least provide complementary information not otherwise captured. Therefore, we propose a technique that searches for trajectories with longer duration and call these 'ordered trajectories'. We apply these ordered trajectories in conjunction with the recent 'improved trajectories' (improved dense trajectories) approach on the UCF101 dataset.

1. Ordered Trajectories

The overall layout of our proposed framework is shown in Fig. 1. Firstly, dense trajectories [4] are detected. The dense trajectories code available online1 [4] was used in all our experiments. 'Ordered trajectories' were recently proposed by [2] to select those trajectories, from the base dense trajectories, that have a longer duration. Dense trajectories are usually computed over every overlapping sequence of L (= 15) frames. However, some trajectories can continue for varying periods beyond the fixed length L. The ordered trajectories technique generates all such trajectories by matching the dense trajectories of every two consecutive frames.
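The linking step, and the normalised displacement descriptor computed from the resulting longer tracks, can be sketched in a few lines. This is a minimal illustration, not the released code: the function names, the toy point-list data structure, and the endpoint-overlap matching rule are our illustrative assumptions standing in for the paper's frame-by-frame matching of dense trajectories.

```python
import numpy as np

L = 15  # fixed window length used by the base dense trajectories

def link_trajectories(window_a, window_b):
    """Link base trajectories from two consecutive overlapping windows.

    Each trajectory is a list of (x, y) tuples. Two trajectories are
    treated as one longer track when the tail of the first coincides
    with the head of the second (an illustrative matching rule; the
    paper matches dense trajectories of every two consecutive frames).
    """
    linked = []
    for ta in window_a:
        for tb in window_b:
            if ta[1:] == tb[:-1]:            # overlapping segment agrees
                linked.append(ta + tb[-1:])  # extend by the new endpoint
    return linked

def trajectory_shape(points):
    """Displacement vectors normalised by the sum of their magnitudes."""
    pts = np.asarray(points, dtype=float)
    disp = np.diff(pts, axis=0)                  # dP_t = P_{t+1} - P_t
    total = np.linalg.norm(disp, axis=1).sum()   # sum of ||dP_j||
    return (disp / total).ravel()
```

For example, linking `[(0,0), (1,0), (2,0)]` with `[(1,0), (2,0), (3,1)]` yields the four-point track `[(0,0), (1,0), (2,0), (3,1)]`, from which the shape descriptor is then computed.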
1 http://lear.inrialpes.fr/people/wang/dense_trajectories

Local descriptors – Motion Boundary Histograms (MBH), Histograms of Oriented Gradients (HOG) and Histograms of Optical Flow (HOF) – of the selected matching trajectories are accumulated, while the Trajectory Shape descriptor is computed from the generated ordered trajectories as follows. For a trajectory of given length L (number of frames) containing a sequence of points P_t = (x_t, y_t), the trajectory shape is described by a sequence of displacement vectors ΔP_t = (P_{t+1} − P_t) = (x_{t+1} − x_t, y_{t+1} − y_t). The resulting vector is normalised by the sum of the displacement vector magnitudes:

T = (ΔP_t, ..., ΔP_{t+L−1}) / Σ_{j=t}^{t+L−1} ||ΔP_j||   (1)

2. Improved Trajectories

Improved trajectories have recently been proposed, and the code released, by Wang et al. [5]. They are an improved version of the dense trajectories obtained by estimating the camera motion: feature points are matched between frames using SURF descriptors and dense optical flow, and the resulting matches are used to estimate a homography with RANSAC. Further, a human body detector is used to separate motion stemming from moving humans from camera motion. The estimate is also used to cancel out possible camera motion from the optical flow. This technique has been shown to significantly improve motion-based descriptors such as MBH and HOF by removing apparent motion not related to the human body. In our experiments, we only use the camera-motion-compensated improved trajectories, without any human body detector. The code is available online2.

2 http://lear.inrialpes.fr/people/wang/improved_
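Once a frame-to-frame homography has been estimated (e.g. with RANSAC over the SURF and optical-flow matches, as Wang et al. do), cancelling the camera motion amounts to subtracting, at each point, the displacement the homography predicts. A minimal NumPy sketch under that assumption follows; the function names are illustrative and not taken from the released code.

```python
import numpy as np

def apply_homography(H, pts):
    """Map an (N, 2) array of points through a 3x3 homography H."""
    ph = np.hstack([pts, np.ones((len(pts), 1))])  # homogeneous coords
    q = ph @ H.T
    return q[:, :2] / q[:, 2:3]                    # de-homogenise

def compensate_flow(pts, flow, H):
    """Subtract the camera-induced displacement predicted by H from
    the measured optical flow vectors at the given points."""
    camera_motion = apply_homography(H, pts) - pts
    return flow - camera_motion
```

With a pure-translation homography, a flow field equal to that translation is cancelled to zero, leaving only the motion of independently moving objects such as human bodies.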



Publication date: 2013